# Academic research
## LLM-jp-3.1-1.8b
- License: Apache-2.0
- Developer: llm-jp
- Tags: Large Language Model · Transformers · Multilingual
- Downloads: 572 · Likes: 1

LLM-jp-3.1-1.8b is a large language model developed by the National Institute of Informatics in Japan. Built on the LLM-jp-3 series, it incorporates instruction pre-training to enhance instruction-following ability.
## LLM-jp-3.1-13b-instruct4
- License: Apache-2.0
- Developer: llm-jp
- Tags: Large Language Model · Transformers · Multilingual
- Downloads: 176 · Likes: 7

LLM-jp-3.1-13b-instruct4 is a large language model developed by the National Institute of Informatics in Japan. It significantly improves instruction-following ability through instruction pre-training and supports multiple languages, including Japanese and English.
## Llama-3.3-Swallow-70B-v0.4
- Developer: tokyotech-llm
- Tags: Large Language Model · Transformers · Multilingual
- Downloads: 1,950 · Likes: 3

Llama 3.3 Swallow is a 70-billion-parameter large language model built on Meta Llama 3.3 that enhances Japanese capabilities while retaining English capabilities.
## VinaLLaMA-7B-chat
- Developer: vilm
- Tags: Large Language Model · Transformers · Other
- Downloads: 519 · Likes: 25

VinaLLaMA is a large language model specifically optimized for Vietnamese, built on the Llama 2 architecture.
## LingoWhale-8B
- Developer: deeplang-ai
- Tags: Large Language Model · Transformers · Multilingual
- Downloads: 98 · Likes: 21

LingoWhale-8B is a Chinese-English bilingual large language model jointly open-sourced by DeepLang Tech and the Tsinghua NLP Lab. It is pre-trained on trillions of high-quality tokens and supports an 8K context window.
## Open Cabrita 3B (GGUF)
- License: Apache-2.0
- Developer: lucianosb
- Tags: Large Language Model · Other
- Downloads: 352 · Likes: 6

Open Cabrita 3B is an open-source large language model optimized for Portuguese, based on the LLaMA architecture and designed to narrow the performance gap between non-English and English models.
## LLaMA-30b
- License: Other
- Developer: huggyllama
- Tags: Large Language Model · Transformers
- Downloads: 2,866 · Likes: 47

LLaMA-30b is a 30-billion-parameter large language model suitable for a wide range of natural language processing tasks.
## LLaMA-7b
- License: Other
- Developer: huggyllama
- Tags: Large Language Model · Transformers
- Downloads: 102.76k · Likes: 326

LLaMA-7b is an open-source large language model released by Meta with 7 billion parameters, suitable for natural language processing tasks.
## bert-base-chinese-ner
- License: GPL-3.0
- Developer: ckiplab
- Tags: Sequence Labeling · Chinese

Provides Traditional Chinese transformers models and natural language processing tools, including this model for named-entity recognition.
- Downloads: 17.95k · Likes: 117